Minimax-Optimal Rates For Sparse Additive Models Over Kernel Classes Via Convex Programming
Authors
Abstract
Sparse additive models are families of d-variate functions that have the additive decomposition f* = ∑_{j∈S} f*_j, where S is an unknown subset of cardinality s ≪ d. We consider the case where each component function f*_j lies in a reproducing kernel Hilbert space, and analyze a simple kernel-based convex program for estimating the unknown function f*. Working within a high-dimensional framework that allows both the dimension d and the sparsity s to scale, we derive convergence rates in the L²(P) and L²(P_n) norms. These rates consist of two terms: a subset selection term of the order (s log d)/n, corresponding to the difficulty of finding the unknown s-sized subset, and an estimation error term of the order s ν_n², where ν_n² is the optimal rate for estimating a single univariate function within the RKHS. We complement these achievable results by deriving minimax lower bounds on the L²(P) error, thereby showing that our method is optimal up to constant factors for sub-linear sparsity s = o(d). Thus, we obtain optimal minimax rates for many interesting classes of sparse additive models, including polynomials, splines, finite-rank kernel classes, as well as Sobolev smoothness classes.
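To convey the flavor of such a kernel-based estimator, here is a minimal numerical sketch: per-coordinate kernel ridge smoothing inside a backfitting loop, with a soft-threshold on each component's empirical L²(P_n) norm to induce sparsity over coordinates. This is a simplified illustration, not the paper's exact convex program (which combines an empirical-norm and a Hilbert-norm penalty); the Gaussian kernel, bandwidth `h`, ridge level, and single threshold `lam` below are all illustrative assumptions.

```python
import numpy as np

def gauss_kernel(x, h=0.5):
    """n-by-n Gaussian Gram matrix for a single coordinate (assumed bandwidth h)."""
    diff = x[:, None] - x[None, :]
    return np.exp(-diff**2 / (2 * h**2))

def sparse_additive_backfit(X, y, lam=0.1, ridge=1e-2, iters=50):
    """Backfitting sketch: smooth each partial residual with a kernel ridge
    smoother, then group-soft-threshold the fitted component's empirical norm.
    Returns F, a (d, n) array of fitted component values at the sample points."""
    n, d = X.shape
    # Precompute per-coordinate ridge smoothers S_j = K_j (K_j + n*ridge*I)^{-1}.
    S = []
    for j in range(d):
        K = gauss_kernel(X[:, j])
        S.append(K @ np.linalg.inv(K + n * ridge * np.eye(n)))
    F = np.zeros((d, n))
    for _ in range(iters):
        for j in range(d):
            r = y - F.sum(axis=0) + F[j]        # partial residual for coordinate j
            f = S[j] @ r                        # kernel ridge smooth
            norm = np.sqrt(np.mean(f**2))       # empirical L2(Pn) norm of the fit
            # Soft-threshold the whole component: inactive coordinates go to zero.
            F[j] = max(0.0, 1.0 - lam / norm) * f if norm > 0 else 0.0
    return F
```

Run on data with two active coordinates out of many, the thresholding step typically zeroes out (or strongly shrinks) the inactive components, mirroring the subset-selection part of the rate, while the smoother controls the per-component estimation error.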
Similar resources
Minimax-optimal rates for high-dimensional sparse additive models over kernel classes
Sparse additive models are families of d-variate functions that have the additive decomposition f* = ∑_{j∈S} f*_j, where S is an unknown subset of cardinality s ≪ d. We consider the case where each component function f*_j lies in a reproducing kernel Hilbert space, and analyze an ℓ1 kernel-based method for estimating the unknown function f*. Working within a high-dimensional framework that allows ...
On the Minimax Optimality of Block Thresholded Wavelets Estimators for ?-Mixing Process
We propose a wavelet-based regression function estimator for the estimation of the regression function for a sequence of ?-mixing random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator based on block thresholding are investigated. It is found that the estimators achieve optimal minimax convergence rates over large class...
Minimax Optimal Rates of Estimation in High Dimensional Additive Models: Universal Phase Transition
We establish minimax optimal rates of convergence for estimation in a high dimensional additive model assuming that it is approximately sparse. Our results reveal an interesting phase transition behavior universal to this class of high dimensional problems. In the sparse regime when the components are sufficiently smooth or the dimensionality is sufficiently large, the optimal rates are identic...
Minimax Estimation via Wavelet Shrinkage
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and Besov-type smoothness constraints, and asymptoti...
Sparse additive regression on a regular lattice
We consider estimation in a sparse additive regression model with the design points on a regular lattice. We establish the minimax convergence rates over Sobolev classes and propose a Fourier-based rate optimal estimator which is adaptive to the unknown sparsity and smoothness of the response function. The estimator is derived within a Bayesian formalism but can be naturally viewed as a penaliz...
Journal:
- Journal of Machine Learning Research
Volume 13, Issue -
Pages -
Publication date: 2012